
than stay in and read a book. Because the likelihood of having a picnic is conditional on whether or not it rains, raining and having a picnic are not independent events, so these probability rules do not apply.

Comparing odds versus probability

You see the word odds used a lot in this book, especially in Chapter 13, which is about the fourfold cross-tab (contingency) table, and Chapter 18, which is about logistic regression. The terms odds and probability are linked, but they mean different things. Imagine you hear that a casino customer places a bet because the odds of losing are 2-to-1. If you ask them why they are doing that, they will tell you that such a bet wins, on average, one out of every three times, which is an expression of probability. We can examine how this works using formulas.

The odds of an event equals the probability of the event occurring divided by the probability of that event not occurring. We already know we can calculate the probability of the event not occurring by subtracting the probability of the event occurring from 1 (as described in the previous section). With that in mind, you can express odds in terms of probability in the following formula:

Odds = Probability / (1 − Probability)

With a little algebra (which you don't need to worry about), you can solve this formula for probability as a function of odds:

Probability = Odds / (1 + Odds)
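As a quick sanity check, the two conversion formulas can be written as a pair of small Python functions (a sketch for illustration; the function names are ours, not the book's):

```python
def odds_from_probability(p):
    """Odds = Probability / (1 - Probability)."""
    return p / (1 - p)

def probability_from_odds(odds):
    """Probability = Odds / (1 + Odds)."""
    return odds / (1 + odds)

# A probability of 0.5 corresponds to odds of 1 (even odds).
print(odds_from_probability(0.5))   # 1.0

# Odds of 2 (2-to-1) correspond to a probability of 2/3.
print(probability_from_odds(2))
```

Because each function is the algebraic inverse of the other, converting a probability to odds and back returns the original probability (up to floating-point rounding).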

Returning to the casino example, if the customer says their odds of losing are 2-to-1, they mean 2/1, which equals 2. If we plug the odds of 2 into the second equation, we get 2/(1 + 2), which is 2/3, or approximately 0.6667. The customer is correct: they will lose two out of every three times, and win one out of every three times, on average.

Table 3-1 shows how probability and odds are related.

As shown in Table 3-1, for very low probabilities, the odds are very close to the probability. But as probability increases, the odds increase faster and faster. By the time probability reaches 0.5, the odds have become 1, and as probability approaches 1, the odds become infinitely large! This definition of odds is consistent with its common-language use. It works the same way as the earlier casino example: if the odds of a horse losing a race are 3-to-1, that means if you bet on this horse, you have three chances of losing and one chance of winning, for a 0.75 probability of losing.
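You can verify this pattern yourself with a short Python sketch (our own illustration, not the book's Table 3-1) that prints the odds for a range of probabilities and then checks the horse-race arithmetic:

```python
# Odds track probability closely at low values, then grow without
# bound as probability approaches 1.
for p in [0.01, 0.1, 0.5, 0.75, 0.9, 0.99]:
    odds = p / (1 - p)
    print(f"probability {p:.2f} -> odds {odds:.4f}")

# The horse example: odds of losing of 3-to-1 mean odds = 3, so the
# probability of losing is 3 / (1 + 3) = 0.75.
print(3 / (1 + 3))  # 0.75
```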